Perceptron-like large margin classifiers
Author
Abstract
We consider perceptron-like algorithms with margin in which the standard classification condition is modified to require a specific value of the margin in the augmented space. The new algorithms are shown to converge in a finite number of steps and are used to approximately locate the optimal weight vector in the augmented space, following a procedure analogous to Bolzano's bisection method. We demonstrate that, as the data are embedded in the augmented space at a larger distance from the origin, the maximum margin in that space approaches the maximum geometric margin in the original space. Thus, our algorithmic procedure can be regarded as an approximate maximal margin classifier. An important property of our method is that its computational cost scales only linearly with the number of training patterns.
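The core idea of the abstract — a perceptron whose update is triggered whenever a pattern fails to reach a prescribed margin in the augmented space, rather than only when it is misclassified — can be illustrated with a minimal sketch. This is not the authors' implementation; the parameter names rho (embedding distance), beta (required margin value), and eta (learning rate) are assumptions made for illustration.

```python
import numpy as np

def margin_perceptron(X, y, rho=1.0, beta=0.1, eta=0.5, max_epochs=1000):
    """Perceptron with a fixed margin condition in the augmented space.

    Each pattern x is embedded as (x, rho); an update fires whenever
    y * (w . x_aug) <= beta, i.e. whenever the pattern fails to achieve
    margin beta, not merely when it is misclassified.
    Illustrative sketch; rho, beta, eta are hypothetical parameter names.
    """
    # Embed the data at distance rho from the origin along the extra axis.
    X_aug = np.hstack([X, np.full((len(X), 1), rho)])
    w = np.zeros(X_aug.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for x, t in zip(X_aug, y):
            if t * (w @ x) <= beta:   # margin condition violated
                w += eta * t * x      # standard perceptron update
                mistakes += 1
        if mistakes == 0:             # every pattern clears margin beta
            return w
    return w
```

Running the outer loop for a range of beta values, bisection-style, is the spirit of the procedure the abstract likens to Bolzano's method: the largest beta for which the algorithm still converges approximates the maximum margin in the augmented space.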
Similar references
The Role of Weight Shrinking in Large Margin Perceptron Learning
We introduce into the classical perceptron algorithm with margin a mechanism that shrinks the current weight vector as a first step of the update. If the shrinking factor is constant, the resulting algorithm may be regarded as a margin-error-driven version of NORMA with constant learning rate. In this case we show that the allowed strength of shrinking depends on the value of the maximum margin....
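The shrinking mechanism described in this snippet can be sketched as follows. This is an illustrative sketch under assumed parameter names (lam for the constant shrinking factor, beta for the margin threshold, eta for the learning rate), not the paper's own code.

```python
import numpy as np

def shrinking_margin_perceptron(X, y, beta=0.1, eta=0.5, lam=0.01, max_epochs=1000):
    """Margin perceptron whose update first shrinks the current weight vector.

    On a margin error (y * w . x <= beta) the update is
        w <- (1 - lam) * w + eta * y * x
    With a constant shrinking factor lam this behaves like a margin-error-driven
    step with constant learning rate, in the spirit of NORMA.
    Illustrative sketch; lam, beta, eta are hypothetical parameter names.
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(X, y):
            if t * (w @ x) <= beta:   # margin error
                w = (1 - lam) * w     # shrink the weight vector first
                w = w + eta * t * x   # then apply the perceptron step
                errors += 1
        if errors == 0:
            return w
    return w
```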
Flexible Margin Selection for Reranking with Full Pairwise Samples
Perceptron-like large margin algorithms are introduced for the experiments with various margin selections. Compared to the previous perceptron reranking algorithms, the new algorithms use full pairwise samples and allow us to search for margins in a larger space. Our experimental results on the data set of (Collins, 2000) show that a perceptron-like ordinal regression algorithm with uneven marg...
Nearest Neighbors with Learned Distances for Phonetic Frame Classification
Nearest neighbor-based techniques provide an approach to acoustic modeling that avoids the often lengthy and heuristic process of training traditional Gaussian mixture-based models. Here we study the problem of choosing the distance metric for a k-nearest neighbor (k-NN) phonetic frame classifier. We compare the standard Euclidean distance to two learned Mahalanobis distances, based on large-mar...
Online Learning of Approximate Maximum Margin Classifiers with Biases
We consider online learning of linear classifiers which approximately maximize the 2-norm margin. Given a linearly separable sequence of instances, typical online learning algorithms such as Perceptron and its variants, map them into an augmented space with an extra dimension, so that those instances are separated by a linear classifier without a constant bias term. However, this mapping might ...
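The augmented-space mapping this snippet refers to — appending a constant extra dimension so that a biased linear classifier becomes a homogeneous one — is the same embedding used in the main abstract. A minimal sketch, with R as an assumed name for the embedding distance:

```python
import numpy as np

def augment(X, R=1.0):
    """Map each instance x to (x, R).

    A homogeneous classifier w_aug in the augmented space encodes a biased
    classifier (w, b) in the original space, since
        w_aug . (x, R) = w . x + R * w_aug[-1] = w . x + b.
    Illustrative sketch; R is a hypothetical parameter name.
    """
    return np.hstack([X, np.full((len(X), 1), R)])
```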
Perceptron-like Algorithms and Generalization Bounds for Learning to Rank
Learning to rank is a supervised learning problem where the output space is the space of rankings but the supervision space is the space of relevance scores. We make theoretical contributions to the learning to rank problem both in the online and batch settings. First, we propose a perceptron-like algorithm for learning a ranking function in an online setting. Our algorithm is an extension of t...
Publication date: 2007